Neural Networks for Quaternion-valued Function Approximation
Authors
Abstract
In this paper a new Multi-Layer Perceptron structure, able to deal with quaternion-valued signals, is proposed. A learning algorithm for the proposed Quaternion MLP (QMLP) is also derived. Such a neural network allows functions of a quaternion variable to be interpolated with fewer connections than the corresponding real-valued MLP.

INTRODUCTION

In the last few years, neural network models have been used in a wide variety of applications, owing to their capabilities in the classification and interpolation of real-valued functions. In particular, the capabilities of the multi-layer perceptron (MLP) structure have been widely investigated, proving that an MLP with only one hidden layer is able to approximate any continuous real-valued function with an arbitrary degree of accuracy [1]. More recently, complex MLP structures have been proposed [2], [3] and suitable learning algorithms have been derived, in order to deal with complex-valued signals with lower computational complexity and fewer parameters than real MLPs. Two different complex MLP structures have been proposed in the literature, differing in the type of activation function embedded in the neurons. In [4] it was proved that the complex MLP proposed in [2], which adopts a bounded, non-analytic activation function, can approximate any continuous complex-valued function with an arbitrary degree of accuracy, while the network structure proposed in [3] can approximate only analytic complex-valued functions. A comparison of the complex and real structures in communications applications is reported in [5], showing that the complex MLP can achieve the same performance as the real MLP with fewer parameters and fewer real multiplications in the feed-forward phase. These results lead to the conclusion that, when the intrinsic characteristics of the signals are embedded into the neural structure, the computational complexity decreases.
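The parameter-count argument above can be made concrete with a back-of-the-envelope count. The sketch below is an illustration of the general principle, not a computation from the paper: a complex-valued network treats each (real, imaginary) pair as one unit, so layer widths halve while each weight stores two real numbers, and the cross-coupling between real and imaginary parts comes for free from complex multiplication. The function names and layer sizes are hypothetical.

```python
def real_mlp_params(n_in, n_hidden, n_out):
    """Real parameters (weights + biases) of a fully connected
    real-valued MLP with one hidden layer."""
    return (n_in * n_hidden + n_hidden) + (n_hidden * n_out + n_out)

def complex_mlp_real_params(n_in, n_hidden, n_out):
    """Same signal handled by a complex MLP: each (re, im) pair becomes
    one complex unit (widths halved), each parameter is one complex
    number (2 reals)."""
    return 2 * real_mlp_params(n_in // 2, n_hidden // 2, n_out // 2)

# Example: a 2-in / 8-hidden / 2-out real network vs its complex counterpart.
print(real_mlp_params(2, 8, 2))          # 42 real parameters
print(complex_mlp_real_params(2, 8, 2))  # 26 real parameters
```

The same counting argument extends to quaternions, where four real signal components collapse into a single quaternion-valued unit.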
Starting from this observation, the idea arose to extend the MLP structure to the hypercomplex domain. In 1843, the Irish mathematician W. R. Hamilton invented quaternions in order to extend 3-dimensional vector algebra to include multiplication and division [6]. Quaternions are generalized complex numbers and represent rotations in space as ordinary complex numbers represent rotations in a plane. Using quaternions to describe finite rotations brings attention to their capability to specify arbitrary rotations in space without degenerating to a singularity [7]. The application of quaternionic analysis to problems of mathematical physics also enables a unified approach to all questions arising in the consideration of boundary value problems [8]. Moreover, in the context of molecular dynamics simulations, quaternions have been rediscovered for the integration of the rotational equations of motion of rigid molecules [9]. In this paper a new multi-layer perceptron structure able to deal with quaternions is proposed. In particular, a suitable activation function for a quaternion neuron is considered, and a recursive learning algorithm for updating the weights of the quaternion MLP (QMLP) is derived. In Section I, the fundamental properties of quaternions are illustrated. In Section II, the notation adopted for the quaternion MLP is described in order to derive the learning algorithm, which is reported in Section III. A numerical example is reported in Section IV in order to evaluate the suitability of the proposed approach, and a comparison between the QMLP and the real MLP is shown.

QUATERNION ALGEBRA

A quaternion q is defined as a generalized complex number, formed from four different units (1, i, j, k) by means of the real parameters q_i (i = 0, ..., 3), where i, j, k are the three orthogonal spatial unit vectors. It is convenient to represent q in the form:

q = q_0 + q_1 i + q_2 j + q_3 k = q_0 + q̄

where q̄ = q_1 i + q_2 j + q_3 k denotes the vector part of the quaternion.